Kullback-Leibler Divergence and Mutual Information of Experiments in the Fuzzy Case
Author
Abstract
The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures. In particular, chain rules for the mutual information of fuzzy partitions and for the Kullback-Leibler divergence with respect to fuzzy P-measures are established. In addition, the convexity of Kullback-Leibler divergence and mutual information with respect to fuzzy P-measures is studied.
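For orientation, here is a minimal sketch of the classical (crisp) analogues that the paper generalizes; the distributions $P = (p_1, \dots, p_n)$, $Q = (q_1, \dots, q_n)$ and the partitions $\xi, \eta, \zeta$ are illustrative notation, not necessarily the paper's fuzzy formalism:

\[
  D(P \,\|\, Q) = \sum_{i} p_i \log \frac{p_i}{q_i},
  \qquad
  I(\xi \vee \eta; \zeta) = I(\xi; \zeta) + I(\eta; \zeta \mid \xi).
\]

The second identity is the classical chain rule for mutual information; the paper establishes its counterpart for fuzzy partitions with respect to fuzzy P-measures.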
Similar Articles
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, this paper examines measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
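For reference, two of the measures named above in their standard forms, for densities $p$ and $q$ (conventional notation, assumed here rather than taken from the article):

\[
  J(p, q) = D(p \,\|\, q) + D(q \,\|\, p),
  \qquad
  H^2(p, q) = \frac{1}{2} \int \big( \sqrt{p(x)} - \sqrt{q(x)} \big)^2 \, dx,
\]

where $J$ is the symmetrized Kullback-Leibler (Jeffreys) divergence and $H$ is the Hellinger distance.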
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating a true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using a model f_θ(x) that is appropriate in some sense. Using Vuong's (1989) test along with a collection of k (> 2) non-nested models, this article constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confide...
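As a rough illustration of the pairwise building block behind such a construction, the following is a minimal Python sketch of the Vuong (1989) statistic; the function name and its inputs are hypothetical, and a full model confidence set would apply such tests across all k candidate models:

import numpy as np
from scipy.stats import norm

def vuong_statistic(logf, logg):
    """Pairwise Vuong (1989) statistic for two non-nested models.

    logf, logg: per-observation log-likelihoods of the two fitted models
    (hypothetical inputs for this sketch). Under H0 (both models equally
    close to the true density h in Kullback-Leibler divergence), the
    statistic is asymptotically N(0, 1).
    """
    m = np.asarray(logf) - np.asarray(logg)  # pointwise log-likelihood ratios
    n = m.size
    v = np.sqrt(n) * m.mean() / m.std()      # standardized log-likelihood ratio
    return v, 2 * norm.sf(abs(v))            # statistic and two-sided p-value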
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, concerning the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
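For context, the LINEX (linear-exponential) loss in its usual parametrization, written for the estimation error $\Delta = \hat{\theta} - \theta$ (conventional notation, assumed here rather than quoted from the article):

\[
  L(\Delta) = b \left( e^{a \Delta} - a \Delta - 1 \right), \qquad a \neq 0,\ b > 0,
\]

which, unlike the symmetric quadratic loss $L(\Delta) = \Delta^2$, penalizes over- and under-estimation asymmetrically depending on the sign of $a$.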
Lecture Three - STAT 212a
The mutual information I(θ; X) between two random variables θ and X is defined as the Kullback-Leibler divergence between their joint distribution and the product of their marginal distributions. It is interpreted as the amount of information that X contains about θ.
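In symbols, writing $p(\theta, x)$ for the joint density and $p(\theta)\,p(x)$ for the product of the marginals:

\[
  I(\theta; X)
  = D\big( p(\theta, x) \,\big\|\, p(\theta)\, p(x) \big)
  = \iint p(\theta, x) \log \frac{p(\theta, x)}{p(\theta)\, p(x)} \, d\theta \, dx .
\]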
Potential Statistical Evidence in Experiments and Renyi Information
Recently, Habibi et al. (2006) defined a pre-experimental criterion for the potential strength of evidence provided by an experiment, based on the Kullback-Leibler distance. In this paper, we investigate the potential statistical evidence in an experiment in terms of the Rényi distance and compare the potential statistical evidence in lower (upper) record values with that in the same number of ii...
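For reference, the Rényi distance of order $\alpha$ between densities $p$ and $q$ is usually written as (conventional notation, assumed rather than taken from the article):

\[
  D_{\alpha}(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha} \, q(x)^{1 - \alpha} \, dx,
  \qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Kullback-Leibler distance in the limit $\alpha \to 1$.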
Journal: Axioms
Volume 6, Issue -
Pages: -
Published: 2017